Efficient Preconditioning for Noisy Separable NMFs by Successive Projection Based Low-Rank Approximations

Authors

  • Tomohiko Mizutani
  • Mirai Tanaka
Abstract

The successive projection algorithm (SPA) can quickly solve a nonnegative matrix factorization problem under a separability assumption. Even when noise is added to the problem, SPA remains robust as long as the perturbations caused by the noise are small; in particular, high robustness against noise is needed when handling problems arising from real applications. The preconditioner proposed by Gillis and Vavasis (2015) makes it possible to enhance the noise robustness of SPA, but at an additional computational cost: constructing the preconditioner involves computing the top-k truncated singular value decomposition of the input matrix. This decomposition is known to provide the best rank-k approximation to the input matrix, that is, the matrix with the smallest approximation error among all matrices of rank at most k. This step is an obstacle to an efficient implementation of the preconditioned SPA. To address the cost issue, we propose a modification of the algorithm for constructing the preconditioner. Whereas the original algorithm uses the best rank-k approximation, our modification uses an alternative that ideally has both high approximation accuracy and low computational cost. To this end, our modification employs a rank-k approximation produced by an SPA-based algorithm. We analyze the accuracy of this approximation and evaluate the computational cost of the algorithm. We then present an empirical study revealing the actual performance of the SPA-based rank-k approximation algorithm and the modified preconditioned SPA.
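
To make the idea concrete, the following is a minimal Python sketch of the standard SPA column-selection rule together with one natural way to turn the selected columns into a rank-k approximation, namely projecting the input matrix onto their span. The projection step, the function names, and the toy data are illustrative assumptions, not necessarily the exact construction analyzed in the paper.

    import numpy as np

    def spa_select_columns(M, k):
        """Successive projection algorithm (SPA): greedily select k column indices.

        At each step, pick the column of the current residual with the largest
        2-norm, then project every column onto the orthogonal complement of the
        chosen column's direction.
        """
        R = np.array(M, dtype=float, copy=True)   # residual matrix
        selected = []
        for _ in range(k):
            norms = np.linalg.norm(R, axis=0)
            j = int(np.argmax(norms))              # column with the largest residual norm
            selected.append(j)
            u = R[:, j] / norms[j]                 # unit vector along the chosen column
            R -= np.outer(u, u @ R)                # remove that direction from all columns
        return selected

    def spa_rank_k_approximation(M, k):
        """Rank-k approximation of M built from SPA-selected columns.

        Projects M onto the column space spanned by the k selected columns;
        a cheap alternative to the top-k truncated SVD, which gives the best
        rank-k approximation.
        """
        cols = spa_select_columns(M, k)
        Q, _ = np.linalg.qr(M[:, cols])            # orthonormal basis of the selected columns
        return Q @ (Q.T @ M)                       # orthogonal projection onto that basis

    # Toy usage on a noisy separable matrix M = [W, W @ H] + noise (hypothetical sizes).
    rng = np.random.default_rng(0)
    m, k, n = 30, 5, 200
    W = rng.random((m, k))
    H = rng.dirichlet(np.ones(k), size=n - k).T    # columns are convex combinations
    M = np.hstack([W, W @ H]) + 1e-3 * rng.standard_normal((m, n))
    M_k = spa_rank_k_approximation(M, k)
    print(np.linalg.norm(M - M_k) / np.linalg.norm(M))   # small relative error

Compared with the top-k truncated SVD, such an approximation is generally less accurate but cheaper to obtain: it needs only k greedy passes over the residual matrix and a QR factorization of an m-by-k block, rather than a truncated SVD of the full m-by-n input.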

Similar articles

Semidefinite Programming Based Preconditioning for More Robust Near-Separable Nonnegative Matrix Factorization

Nonnegative matrix factorization (NMF) under the separability assumption can provably be solved efficiently, even in the presence of noise, and has been shown to be a powerful technique in document classification and hyperspectral unmixing. This problem is referred to as near-separable NMF and requires that there exists a cone spanned by a small subset of the columns of the input nonnegative ma...

Finite-Rank Multivariate-Basis Expansions of the Resolvent Operator as a Means of Solving the Multivariable Lippmann–Schwinger Equation for Two-Particle Scattering

Finite-rank expansions of the two-body resolvent operator are explored as a tool for calculating the full three-dimensional two-body T-matrix without invoking the partial-wave decomposition. The separable expansions of the full resolvent that follow from finite-rank approximations of the free resolvent are employed in the Low equation to calculate the T-matrix elements. The finite-rank expansi...

Low Permutation-rank Matrices: Structural Properties and Noisy Completion

We consider the problem of noisy matrix completion, in which the goal is to reconstruct a structured matrix whose entries are partially observed in noise. Standard approaches to this underdetermined inverse problem are based on assuming that the underlying matrix has low rank, or is well-approximated by a low rank matrix. In this paper, we propose a richer model based on what we term the “permu...

Hierarchical low-rank approximation of tensors by successive rank one corrections for preconditioning and solving high dimensional linear systems

We propose an algorithm for preconditioning and solving high dimensional linear systems of equations in tensor format. The algorithm computes an approximation of a tensor in hierarchical Tucker format in a subspace constructed from successive rank one corrections. The algorithm can be applied for the approximation of a solution or of the inverse of an operator. In the latter case, properties su...

Rank Selection in Low-rank Matrix Approximations: A Study of Cross-Validation for NMFs

We consider the problem of model selection in unsupervised statistical learning techniques based on low-rank matrix approximations. While k-fold cross-validation (CV) has become the standard method of choice for model selection in supervised learning techniques, its adaptation to unsupervised matrix approximation settings has not received sufficient attention in the literature. In this paper, we...

Journal:
  • CoRR

Volume: abs/1710.00387  Issue: -

Pages: -

Publication date: 2017